A MEMORY EFFICIENT INCREMENTAL GRADIENT METHOD FOR REGULARIZED MINIMIZATION

Authors

Abstract


Similar articles

A coordinate gradient descent method for ℓ1-regularized convex minimization

In applications such as signal processing and statistics, many problems involve finding sparse solutions to under-determined linear systems of equations. These problems can be formulated as structured nonsmooth optimization problems, i.e., the problem of minimizing ℓ1-regularized linear least squares problems. In this paper, we propose a block coordinate gradient descent method (abbreviated a...
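To make the setting above concrete, here is a minimal sketch of plain coordinate descent with soft-thresholding for the ℓ1-regularized least-squares (lasso) problem. This is a generic illustration of the problem class, not the block coordinate gradient descent method proposed in the cited paper; the function names and parameters are illustrative.

```python
import numpy as np

def soft_threshold(z, t):
    # Soft-thresholding operator: the proximal map of t * |.|
    return np.sign(z) * np.maximum(np.abs(z) - t, 0.0)

def cd_lasso(A, b, lam, n_iter=200):
    """Coordinate descent for min_x 0.5*||Ax - b||^2 + lam*||x||_1."""
    m, n = A.shape
    x = np.zeros(n)
    col_sq = (A ** 2).sum(axis=0)  # per-coordinate curvature ||A_j||^2
    r = b - A @ x                  # running residual
    for _ in range(n_iter):
        for j in range(n):
            if col_sq[j] == 0.0:
                continue
            # Gradient information for coordinate j, with x_j "removed"
            rho = A[:, j] @ r + col_sq[j] * x[j]
            x_new = soft_threshold(rho, lam) / col_sq[j]
            r += A[:, j] * (x[j] - x_new)  # keep residual consistent
            x[j] = x_new
    return x
```

On noiseless data with a small regularization weight, the iterates recover the sparse generating vector up to a small shrinkage bias.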


A regularized limited-memory BFGS method for unconstrained minimization problems

The limited-memory BFGS (L-BFGS) algorithm is a popular method for solving large-scale unconstrained minimization problems. Since L-BFGS conducts a line search with the Wolfe condition, it may require many function evaluations for ill-posed problems. To overcome this difficulty, we propose a method that combines L-BFGS with the regularized Newton method. The computational cost for a single iterat...
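The L-BFGS core mentioned above is the two-loop recursion, which applies an approximate inverse Hessian to the gradient using only a short history of step/gradient-difference pairs. The sketch below shows that standard recursion; it is not the regularized variant proposed in the cited paper, and all names are illustrative.

```python
import numpy as np

def lbfgs_direction(grad, s_list, y_list):
    """Two-loop recursion: return -H_k * grad, where H_k is the L-BFGS
    inverse-Hessian approximation built from pairs s_i = x_{i+1} - x_i
    and y_i = grad_{i+1} - grad_i (oldest first)."""
    q = grad.copy()
    rhos = [1.0 / (y @ s) for s, y in zip(s_list, y_list)]
    alphas = []
    # First loop: newest pair to oldest
    for s, y, rho in reversed(list(zip(s_list, y_list, rhos))):
        a = rho * (s @ q)
        alphas.append(a)
        q -= a * y
    if s_list:
        s, y = s_list[-1], y_list[-1]
        q *= (s @ y) / (y @ y)  # initial scaling gamma_k * I
    # Second loop: oldest pair to newest
    for (s, y, rho), a in zip(zip(s_list, y_list, rhos), reversed(alphas)):
        b = rho * (y @ q)
        q += (a - b) * s
    return -q
```

With no stored pairs the recursion reduces to steepest descent, and on a quadratic a stored pair (s, Hs) is reproduced exactly: applied to the gradient y = Hs, the recursion returns -s.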


Parallel Coordinate Descent Newton Method for Efficient $\ell_1$-Regularized Minimization

The recent years have witnessed advances in parallel algorithms for large scale optimization problems. Notwithstanding demonstrated success, existing algorithms that parallelize over features are usually limited by divergence issues under high parallelism or require data preprocessing to alleviate these problems. In this work, we propose a Parallel Coordinate Descent Newton algorithm using mult...


Efficient algorithm for regularized risk minimization

The recently proposed Optimized Cutting Plane Algorithm (OCA) is an efficient method for solving large-scale quadratically regularized risk minimization problems. The existing open-source library LIBOCAS implements the OCA algorithm for two important instances of such problems, namely, the Support Vector Machines algorithms for training a linear two-class classifier (SVM) and for training a linear mult...



Journal

Journal title: Bulletin of the Korean Mathematical Society

Year: 2016

ISSN: 1015-8634

DOI: 10.4134/bkms.2016.53.2.589